Theory X and Theory Y

Results: 107 (items 11-20 shown below)



#  Item

11
Probability theory / Statistical dependence / Data analysis / Covariance / Variance / Correlation and dependence / Independence / Expected value / Sample mean and sample covariance / Statistics / Covariance and correlation / Algebra of random variables

Characterizing the relationship between two quantitative variables X and Y - Ricco Rakotomalala

Source URL: eric.univ-lyon2.fr

Language: English - Date: 2014-12-12 04:24:31
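As an aside on this entry's topic: a minimal Python sketch of the sample covariance and Pearson correlation between two quantitative variables X and Y. The data values are made up for illustration and are not from Rakotomalala's notes.

    # Sample covariance and Pearson correlation between X and Y.
    # The data below are illustrative only.
    import statistics

    x = [2.0, 4.0, 6.0, 8.0, 10.0]
    y = [1.5, 3.9, 6.1, 7.8, 10.2]

    mean_x, mean_y = statistics.mean(x), statistics.mean(y)

    # Unbiased sample covariance: sum of centred products over (n - 1).
    cov_xy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / (len(x) - 1)

    # Pearson correlation: covariance normalised by both standard deviations.
    r_xy = cov_xy / (statistics.stdev(x) * statistics.stdev(y))

    print(f"cov(X, Y) = {cov_xy:.3f}, r(X, Y) = {r_xy:.3f}")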
12

2009 Paper 7 Question 11, Information Theory and Coding: (a) Let X and Y be discrete random variables over state ensembles {x} and {y} having probability distributions p(x) and p(y), conditional probability distributions ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:18:30
13

2007 Paper 8 Question 7, Information Theory and Coding: (a) Suppose that X is a random variable whose entropy H(X) is 8 bits. Suppose that Y(X) is a deterministic function that takes on a different value for each value of X ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:18:20
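A note on the standard resolution of this question's setup: if Y(X) is deterministic and takes a different value for each value of X (i.e. it is injective), then p(Y = f(x)) = p(X = x) and therefore H(Y) = H(X) = 8 bits. A minimal Python check of that relabelling argument, using an illustrative four-symbol distribution rather than the 8-bit one in the question:

    # Entropy is invariant under an injective deterministic relabelling.
    # Distribution and mapping are illustrative, not from the exam paper.
    from math import log2

    p_x = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # H(X) = 1.75 bits
    f = {"a": 0, "b": 1, "c": 2, "d": 3}                  # injective map

    def entropy(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    # Push the distribution forward through f: p(Y = f(x)) = p(X = x).
    p_y = {}
    for sym, p in p_x.items():
        p_y[f[sym]] = p_y.get(f[sym], 0.0) + p

    print(entropy(p_x), entropy(p_y))   # both 1.75, so H(Y) = H(X)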
14

1997 Paper 8 Question 11, Information Theory and Coding: The input source to a noisy communication channel is a random variable X over the four symbols a, b, c, d. The output from this channel is a random variable Y over ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:17:17
15

2006 Paper 7 Question 8, Information Theory and Coding: (a) Give three different expressions for mutual information I(X; Y) between two discrete random variables X and Y, in terms of their two conditional entropies H(X|Y) ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:18:12
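For reference, the three expressions the question asks for are I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X, Y). A minimal Python check that they agree, on a joint distribution made up for illustration:

    # Verify the three expressions for mutual information coincide.
    # The joint distribution p(x, y) is illustrative only.
    from math import log2

    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    def H(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p

    h_x_given_y = H(joint) - H(p_y)   # chain rule: H(X|Y) = H(X,Y) - H(Y)
    h_y_given_x = H(joint) - H(p_x)   # chain rule: H(Y|X) = H(X,Y) - H(X)

    i1 = H(p_x) - h_x_given_y
    i2 = H(p_y) - h_y_given_x
    i3 = H(p_x) + H(p_y) - H(joint)

    print(i1, i2, i3)   # all equal, ~0.278 bits here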
16

COMPUTER SCIENCE TRIPOS Part II – 2013 – Paper 9, Question 6, Information Theory and Coding (JGD): (a) Two random variables X and Y are correlated. The marginal probabilities p(X) and p(Y) are known, as is their joint probability distribution ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:18:42
17

1998 Paper 8 Question 11, Information Theory and Coding: Consider a binary symmetric communication channel, having source alphabet X = {0, 1} with probabilities {0.5, 0.5}. Its output alphabet is Y = {0, 1} and its channel ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:17:24
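With the uniform {0.5, 0.5} input given here, a binary symmetric channel's mutual information equals its capacity, C = 1 - H2(e), where H2 is the binary entropy function and e is the crossover probability. A minimal sketch; the crossover probability 0.1 is assumed for illustration, since the truncated snippet does not show the question's channel matrix:

    # Capacity of a binary symmetric channel: C = 1 - H2(e).
    # The crossover probability e = 0.1 is illustrative only.
    from math import log2

    def h2(p):
        # Binary entropy in bits, with the convention 0 log 0 = 0.
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    e = 0.1
    print(f"C = {1 - h2(e):.4f} bits per symbol")   # ~0.5310 for e = 0.1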
18

Tomas Everaert (Vrije Universiteit Brussel), Monotone-light factorisation systems and torsion theories: Given a torsion theory (Y, X) in an abelian category C, the reflector I : C → X to the torsion-free subcategory X induces ...

Source URL: web.science.mq.edu.au

Language: English - Date: 2013-06-21 05:35:51
19

2004 Paper 8 Question 10, Information Theory and Coding: (a) Consider a binary symmetric communication channel, whose input source is the alphabet X = {0, 1} with probabilities {0.5, 0.5}; whose output alphabet is Y = {0, 1} ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:18:01
20
Applied mathematics / Fourier analysis / Information theory / Digital signal processing / Sinc function / Integral transforms / Fourier transform / Rectangular function / Entropy / Mathematical analysis / Signal processing / Mathematics

2006 Paper 8 Question 16, Information Theory and Coding: (a) Suppose we know the conditional entropy H(X|Y) for two slightly correlated discrete random variables X and Y. We wish to guess the value of X, from knowledge of Y ...

Source URL: www.cl.cam.ac.uk

Language: English - Date: 2014-06-09 10:18:14
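The guessing setup in this last snippet is where Fano's inequality applies: any estimator of X from Y has error probability P_e satisfying H(X|Y) <= H2(P_e) + P_e * log2(|X| - 1). A minimal sketch of the weaker closed-form bound P_e >= (H(X|Y) - 1) / log2(|X|); the values of H(X|Y) and |X| below are assumed for illustration:

    # Fano-style lower bound on the error of guessing X from Y.
    # H(X|Y) = 1.5 bits and |X| = 4 are illustrative values only.
    from math import log2

    h_x_given_y = 1.5    # conditional entropy H(X|Y), in bits (assumed)
    alphabet_size = 4    # |X| (assumed)

    # Weak form of Fano's inequality: P(error) >= (H(X|Y) - 1) / log2(|X|).
    p_err_lower = max(0.0, (h_x_given_y - 1) / log2(alphabet_size))
    print(f"P(error) >= {p_err_lower:.3f}")   # 0.25 with these numbers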